perm filename WINO[4,KMC]1 blob sn#065426 filedate 1973-10-30 generic text, type C, neo UTF8
~ti:COURSE OUTLINE\CS 265 - LINGUISTICS 265\ \COMPUTER MODELS
\FOR NATURAL LANGUAGE~
Terry Winograd\Stanford University
~Fall Quarter 1973\{date}~

.select 1
.odd heading (%eTerry Winograd%*,%eStanford University - Fall Quarter 1973%*,%e{page}%*)
.even heading (%e{page}%*,%eCS 265 - Linguistics 265%*,%eCourse Outline%*)
~he:Course Description~
.begin center

%2CS 265
Linguistics 265%*

%bIntroduction to Computational Linguistics%*

Fall Quarter, 1973
Terry Winograd
Polya 256
X-3824

M-W-F 10
Undergraduate Library 144

.end

CS 265 (Linguistics 265) is intended as an introduction to computational
linguistics for people who have some familiarity with linguistics, computation,
or psychology.  It does not assume a strong background in any one
of them, but students who are not solidly grounded in the basics of
each are expected to fill in their knowledge through individual
reading.  The emphasis of the course will be on providing a 
conceptual framework for understanding current research in computational
formalisms for natural language.

The approach will center on a "common-sense" view of the problems
which arise in writing computer programs to understand language,
asking questions like "What knowledge would you need to understand
this piece of text?"
and "What is a reasonable representation for that knowledge?"
This will lead into many branches of linguistics,
computation theory, logic, and epistemology.  The lectures will
attempt to relate these various explorations, and suggest materials
for students who are interested in following up on any of them.

Two short "thought-papers" will be assigned in which students are
expected to read new material and relate it to the ideas we have
been discussing in class.  

For people interested in pursuing this material further, CS 266
(Linguistics 266) is intended as a continuation which will go
more deeply into individual topics of current research interest.
It will be limited in size and based on discussion of research
papers.  A tentative topic list is available.
~he:Basic References~

There is no single text which covers a substantial part of
the material in this course.  Much of it exists only in
current research papers, or scattered through books in
other subjects.  I will try to hand out as much as I can
in the form of xeroxed papers and notes.  I will list the
main sources, and try to make copies available in the
various libraries.  There is too much to expect people to
buy everything, but people seriously interested in the subject
will probably want to have at least some of them.
I hope to be able to pass out those things I expect
everyone to read which aren't easily available.
.begin indent 6,10

~sub:Linguistics~

John Lyons, %eIntroduction to Theoretical Linguistics%*,
Cambridge Univ. Press, 1971 (paperback $3.95)
-- A thorough introduction to a variety of traditional linguistic
approaches.

John Lyons, %eNew Horizons in Linguistics%*, Pelican, 1970
(paperback $2.25) -- A more general overview of language studies.

~sub:Logic, Computation Theory, & Formal Linguistics~

Robert Wall, %eIntroduction to Mathematical Linguistics%*,
Prentice-Hall, 1972 (hardback) -- A good introduction to
a wide scope of relevant mathematics.  Other books are
more thorough in specific areas, but this is deep enough
for the purposes of the course.

~sub:Philosophy~
John Searle (ed.), %eThe Philosophy of Language%*, Oxford U.
1971 (paperback $1.95) -- An easily accessible collection
of readings bringing up some of the basic issues.

~sub:Artificial Intelligence~

Marvin Minsky (ed.), %eSemantic Information Processing%*,
MIT Press, 1968 (hardback) -- A collection of much of the
early work in natural language understanding, especially
at M.I.T.

Marvin Minsky and Seymour Papert, %eProgress Report:
Artificial Intelligence Laboratory%* MIT, AI-Memo 252,
1971 (available on request) -- A rather general overview
of the attitudes towards AI implicit in much of the
language work.

Terry Winograd, %eUnderstanding Natural Language%*, Academic
Press, 1972 (hardback)
[also appeared as the January 1972 issue of
%eCognitive Psychology%* and is a rewriting of my thesis
which was issued as AI-TR-17 and MAC-TR-84 at MIT] --
Describes a particular project, but contains lots of
general introductory material to the underlying
theories.

Kenneth Colby and Roger Schank (eds.) %eComputer Models of
Thought and Language%*, Freeman, 1973 (hopefully soon!! we may be able
to get copies of some of the proofs if it isn't out) --
A collection of much of the more recent AI natural language
work at a variety of places.

.end
~he:Schedule of Topics~
.begin nofill


%2Lecture #		Topic
(approximate)%*

1-2	Introduction and Overview
3-4	Simple grammar formalisms, transformational grammar
5-6	Case grammars, systemic grammar
7	Parsing strategies
8-11	The structure of English
12	Overview of semantics
13	Traditional approaches to semantics
14	Early computational approaches 
15	Formal logic, resolution
16	Languages for procedural representation
17	Integrating procedural and declarative knowledge
18	Lexical issues, primitives
19	Metaphor, analogy, "fuzziness"
20	Time, space, quantification
21	Chunking of knowledge
22	Structure of discourse
23	Integration of the understanding process
24	Psycholinguistic models
25	Philosophical and meta-theoretical issues
26	Implications and directions for future work
27-30	Overflow
.end
~he:Readings on Syntax~
.macro int ⊂break;once indent 16,16,8⊃
.indent 8,12

~sub:I.  Basic ideas of grammar~
.once indent 6
%bPhrase structure, categorial, context-free formalizations%*

Lyons, %eIntroduction...%* chapters 4 and 6
.int
These chapters, like most of Lyons' book, give a basic clear picture
of the linguistic ideas.  They are valuable for picking up basic
vocabulary.

Dan Bobrow, "Syntactic analysis of English by Computer -- a survey"
%eProceedings, Fall Joint Computer Conference%*, 1963, pp. 365-387
.int
This paper is a quick, readable summary of most of the early
syntactic implementations.
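The phrase-structure idea can be made concrete with a toy context-free grammar and a naive top-down recognizer.  This is a modern illustrative sketch, not part of the course materials; the grammar, lexicon, and sentences are invented.

```python
# A toy phrase-structure (context-free) grammar and a naive top-down
# recognizer.  Grammar, lexicon, and sentences are invented for
# illustration only.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"], ["V"]],
}
LEXICON = {
    "Det": {"the", "a"},
    "N":   {"dog", "ball"},
    "V":   {"chased", "slept"},
}

def derive(symbol, words, i):
    """Yield each position j such that words[i:j] can derive from symbol."""
    if symbol in LEXICON:                      # terminal category
        if i < len(words) and words[i] in LEXICON[symbol]:
            yield i + 1
        return
    for expansion in GRAMMAR[symbol]:          # nonterminal: try each rule
        positions = [i]
        for part in expansion:
            positions = [j for p in positions for j in derive(part, words, p)]
        yield from positions

def recognize(sentence):
    words = sentence.split()
    return len(words) in derive("S", words, 0)
```

A sentence is accepted when some derivation of S consumes exactly the whole input, e.g. `recognize("the dog chased a ball")`; the sketch does blind exhaustive search, which is exactly the inefficiency that the parsing-strategy readings address.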

.once indent 6
%bFormal languages, finite state and pushdown automata%*

Wall, %eIntroduction...%* chapters 9 and 10
.int
This book is good for a basic introduction to many aspects of
the relevant mathematics.  Anyone who has had a course in
these topics knows enough already.  Anyone who hasn't may
have to skim through some of the earlier chapters to be
straight on notation, but it's all stuff worth knowing.
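A finite-state automaton of the kind treated in these chapters amounts to nothing more than a transition table.  The following Python sketch is illustrative (the particular machine is invented, not from Wall):

```python
# A finite-state machine as a transition table.  This machine (invented
# for illustration) accepts strings over {a, b} with an even number of b's.
TRANSITIONS = {
    ("even", "a"): "even", ("even", "b"): "odd",
    ("odd",  "a"): "odd",  ("odd",  "b"): "even",
}

def fsm_accepts(s, start="even", finals=("even",)):
    state = start
    for ch in s:
        state = TRANSITIONS.get((state, ch))
        if state is None:          # no transition defined: reject
            return False
    return state in finals
```

A pushdown automaton differs only in consulting a stack as well as the current state on each move, which is what lets it recognize the context-free languages.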


~sub:II. Transformational grammar~
.int
This course does not go into transformational approaches in
any detail.  However it is important to understand how the
theories compare, and to have a basic transformational
literacy since it is the most widely understood formalism
for linguistics and meta-linguistic theory.  These readings
are for people who want to learn it in some detail.

Noam Chomsky,
%eSyntactic Structures%* (early version but easy to understand)
Noam Chomsky,
%eAspects of the Theory of Syntax%*
.int
This book is more advanced -- except for
the opening theoretical section it's only of interest if
you want all the gory details -- fairly tough going unless
you're versed in the theory already.

Bruce Liles, %eAn Introductory Transformational Grammar%*, Prentice
Hall, 1971
.int
This is probably the clearest introduction to transformational
theory and practice for beginners.

~sub:III. Systemic and case grammars~

Winograd, %eUnderstanding...%* section 1.4 --
Thesis sections 2.1 to 2.2.3 and 2.3.1 - 2.3.2
.int
There is no good introductory text for Systemic grammar.  These
sections give some introduction, but since writing them I've
come to a different view on what aspects should be emphasized.
Hopefully something better will be available soon.

M.A.K. Halliday, "Language structure and language function", in
Lyons,  %eNew Horizons...%*, pp. 140-165
.int
Halliday's style is difficult, and the paper is slightly confusing, but
it touches all the basic issues and is worth struggling through.

Charles Fillmore, "The case for case" in Bach and Harms (eds.) %eUniversals
in Linguistic Theory%*, Holt, Rinehart Winston, 1968, pp. 1-90
.int
This paper was the first to tie case formalisms to modern syntactic
practice.  It isn't necessary to plow through the whole thing,
but nothing since has really changed the basic issue presented.

~sub:IV. Parsing~

Ron Kaplan, "Augmented transition networks as psychological models of
sentence comprehension"  2nd IJCAI, British Computer Society, 1971
.int
Kaplan describes ATN parsers, and relates them to human parsing
performance.  It is a good description of the straightforward ATN
theory.  People interested in going further should see Kaplan's
more recent papers.
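The flavor of an ATN can be suggested with a stripped-down sketch: arcs either match a word's lexical category (CAT arcs) or recursively invoke another network (PUSH arcs).  The registers and arc actions that make ATNs "augmented" are omitted, and the networks and lexicon here are invented for illustration.

```python
# A miniature ATN-style transition network.  An arc is (label, target);
# a label naming another network is a PUSH arc (recursive call), any
# other label is a CAT arc matched against the word's lexical category.
# Registers and actions, the "augmented" part, are omitted for brevity.
NETWORKS = {
    "S":  {0: [("NP", 1)], 1: [("verb", 2)], 2: [("NP", 3)]},
    "NP": {0: [("det", 1)], 1: [("noun", 2)]},
}
FINAL = {"S": {2, 3}, "NP": {2}}
LEXICON = {"the": "det", "a": "det", "dog": "noun",
           "ball": "noun", "chased": "verb"}

def traverse(net, state, words, i):
    """Yield input positions reachable with `net` in a final state."""
    if state in FINAL[net]:
        yield i
    for label, target in NETWORKS[net].get(state, []):
        if label in NETWORKS:                       # PUSH arc
            for j in traverse(label, 0, words, i):
                yield from traverse(net, target, words, j)
        elif i < len(words) and LEXICON.get(words[i]) == label:
            yield from traverse(net, target, words, i + 1)   # CAT arc

def atn_accepts(sentence):
    words = sentence.split()
    return len(words) in traverse("S", 0, words, 0)
```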

Winograd, %eUnderstanding...%* 2.2, 5.1, 5.3
-- Thesis 2.2, 2.5, 2.3.11
.int
There is no clearly defined body of theory in these sections,
but they give a good feeling for the way parsing is actually
done in the system, and how it relates to other parts of the
analysis.
Again, my views have changed somewhat and I hope to hand out
supplementary material.
~he:Readings on Semantics~
~sur:Traditional approaches and componential analysis~
.indent 8,12
.macro int ⊂break;once indent 16,16,8⊃
Lyons, %eIntroduction...%*, Chapters 9 and 10
.int
These chapters are a little long and dreary, but they are clear
and give a basic literacy in the underlying issues.

Manfred Bierwisch, Semantics, in Lyons %eNew Horizons...%*
.int
This introduction is sketchy, but presents the basic way of
connecting semantics to transformational syntax -- at least
as far as it had been developed.

~sur:Early computational approaches~

Joe Weizenbaum, ELIZA
.int
One of the earliest and most startling programs.  Millions
of people have seen it in action.  Everyone should know
how it works.

Bert Green et al., BASEBALL, in Feigenbaum and Feldman, %eComputers and Thought%*
.int
Worth skimming to get a feeling for just how much it could handle --
it's surprisingly sophisticated considering how early it was.

Marvin Minsky, %eSemantic Information Processing%*
.int
In particular, the papers by Bobrow and Raphael.  Again, the details
aren't important, but you should have a feel for the scope
of the things the programs could handle.

~sur:Formal Logic~
.int
I will assume a basic literacy in formal logic -- Boolean algebra,
simple propositional and predicate calculus, the notions
of rule of inference and proof
procedure, etc.  If you feel weak in this, read Wall, %eIntroduction...%*
Chapters 1 - 3.
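For the propositional part of this literacy, the notions of validity and rule of inference can be checked mechanically by truth table.  The sketch below is illustrative (the encoding as Python functions is invented; the two inference patterns are standard):

```python
from itertools import product

# Truth-table check of a propositional inference: an argument is valid
# when every assignment that satisfies all the premises also satisfies
# the conclusion.  Formulas are encoded as functions of truth values.
def valid(premises, conclusion, n_vars):
    for vals in product([True, False], repeat=n_vars):
        if all(p(*vals) for p in premises) and not conclusion(*vals):
            return False        # found a countermodel
    return True

# Modus ponens: from P and P -> Q, infer Q.  (P -> Q is (not P) or Q.)
modus_ponens = valid([lambda p, q: p,
                      lambda p, q: (not p) or q],
                     lambda p, q: q, 2)

# Affirming the consequent: from Q and P -> Q, inferring P is invalid
# (p=False, q=True is a countermodel).
affirm_consequent = valid([lambda p, q: q,
                           lambda p, q: (not p) or q],
                          lambda p, q: p, 2)
```

Resolution, discussed under the Robinson reference, replaces this exhaustive enumeration with a single uniform rule of inference suitable for predicate calculus.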

~sur:Procedural Representations~

Winograd, Section 6 -- thesis 3.1, 3.3
.int
A readable introduction to the basic PLANNER ideas.

Bobrow and Raphael, Survey of AI Languages, %e3IJCAI%*
.int
This paper compares a variety of procedural AI representations.
It is a good introduction to serve as a leaping off point
into the individual manuals.

~sur:Integrating procedural and declarative knowledge~
.int
Good luck.  This is a wide open area, and I'll try to point to
things when we get there (maybe even have something written).

~sur:Lexical issues - primitives~

George Miller, English verbs of motion
.int
An attempt to map a simple lexical area.  Approach is not
basically theoretical, but it provides good grist for discussion.

Roger Schank, in Colby and Schank, %eComputer models...%*
.int
Schank's conceptual dependency theory places strong emphasis
on the existence of a small number of universal primitives.

~sur:Metaphor, analogy, fuzziness~
.int
This is a fuzzy topic -- I hope to get together some recent papers.

~sur:Time, space, quantification~
.int
There are lots of stray issues here, and I haven't picked out
the exact things to cover yet.

~sur:Chunking of Knowledge~
.int
All of the things here represent parts of theories in development.
It isn't worth learning all the details of each, but hopefully
looking through them will give some basis for comparison and
finding common elements.

Gene Charniak, Jack and Janet... %e3IJCAI%*

Bob Abelson, in Colby and Schank (hopefully copies will be available)

Roger Schank,  papers in general

Wallace Chafe, First technical report...

~sur:Structure of Discourse~

Winograd, 8.2 -- thesis 4.3

John Searle, %eSpeech Acts%*
.int
Analyzes language as an act of communication, with relations
to appropriate situation.  The basic viewpoint is worth
knowing.

M.A.K. Halliday, Language structure... in Lyons %eNew Horizons...%*
.int
A beginning of a theory relating discourse phenomena to the
analysis of syntax.

~sur:Integration of the understanding process~

Winograd, Sections 1 and 8 -- thesis 1.1, 4

Schank et al., MARGIE, %e3IJCAI%*

~sur:Psycholinguistic models~
.int
to be decided

~sur:Philosophical and meta-theoretical issues~
.int
to be decided
~he:References~
.macro in5 ⊂break;once indent 10,5,5⊃
.fill
.indent 5,0
This list is not complete -- changes and additions will be
made later in the course, and I will try to indicate which
items are relevant to a given lecture.  The starred items
are those which are specifically referred to.  The others
are more general reference sources, or articles of less
importance.  The references %e1IJCAI%*, %e2IJCAI%*, and %e3IJCAI%*
are to the preprints of the International Joint Conference
on Artificial Intelligence, held in 1969, 1971, and 1973 
respectively.

.indent 1,5
.at "~*" ⊂once indent 0,5}*{ ⊃


~*Abelson, Robert, The Structure of Belief
Systems, to appear in Colby and Schank,
1973.
.in5
     Abelson tries to formalize how our knowledge
of actions and belief is structured.
He emphasizes the ways in which situations
are perceived in terms of a stock of generic
events and scripts.

~*Bobrow, Daniel, "Natural Language Input for a
Computer Problem Solving System," in Minsky,
1968, pp. 133-215.
.in5
     Bobrow's STUDENT program to
solve word-problems in algebra was one of the first
language-understanders to use a sophisticated problem
solving model as the basis for understanding.  His thesis
(reprinted in SIP) is quite readable.

Bruce, Bertram, A Model for Temporal References
and Its Application in a Question Answering Program,
%eArtificial Intelligence%*
3 (1972), pp. 1-25.

Chafe, Wallace,
First Technical Report, Contrastive Semantics
Project, Dept. of Linguistics, Berkeley, Oct. 1972.
.in5
     Chafe talks about the process of "linguisticization"
from internal concepts to surface structure.  The main
interest is in the structuring of knowledge into
"subconceptualizations", which correspond
loosely to Abelson's scripts, Minsky's frames,
etc.

~*Charniak, Eugene, Jack and Janet in Search of a Theory of
Knowledge, %e3IJCAI%*, pp. 337-343.
.in5
This short paper explains some of the basic ideas developed
in Charniak's thesis on children's story understanding.

Charniak, Eugene, Towards a Model of Children's Story
Comprehension, MIT AI-TR 266, December, 1972.
.in5
     Charniak's thesis tackles the problem of
organizing the knowledge of an area bigger than
the toy worlds like blocks.  He emphasizes the
ways in which the knowledge is centered around
different concepts, organized around
"demons" which are triggered by certain
patterns of input, and serve to organize
further parts of the story as it is told.

~*Colby, Kenneth, L. Tesler, and H. Enea,
Experiments with a Search Algorithm
for the Data Base of a Human Belief Structure,
%e1IJCAI%*, 1969.
.in5
     Colby is concerned with belief systems -- with
modelling the fact that humans place different
amounts of faith
in different pieces of knowledge.  His model
is simplistic, but tries to deal with the question
of how these belief values are created and changed.

     
Derksen, Jan, J.F. Rulifson, and R.J. Waldinger,
The QA4 Language applied to Robot Planning,
%eFJCC%* 1972, pp. 1181-1187.
.in5
     QA4 is a general-purpose deductive
language much like PLANNER, developed
by Rulifson at SRI.

Feigenbaum, Edward, and Julian Feldman,
%eComputers and Thought,%*
McGraw Hill, 1963.
.in5
This was one of the earliest AI collections.  The article
by Lindsay is on natural language, and several others bring
up relevant issues of representation.

Fikes, Richard, and Nils Nilsson, STRIPS:
A New Approach to The Application of Theorem
Proving to Problem Solving,
%e2IJCAI%*, 1971, pp. 608-620
.in5
     STRIPS is a Planner-like problem solving language
developed for the robot project at SRI. It
is somewhat more oriented towards a theorem-proving approach,
but includes procedural ideas.

~*Fillmore, Charles, The Case for Case, in Bach
and Harms (eds.)
%eUniversals in
Linguistic Theory,%*
Holt, Rinehart, and Winston, 1968.
.in5
     Along with
other linguists like Lakoff, Fillmore is turning
now towards questions of meaning, away from the more
purely syntactic approach.  Also as with other
linguists, most of his interesting stuff seems to
be in unpublished mimeographs, including a set
of 6 lectures he gave recently, entitled
"May we come in?", "Space," "Time,"
"Coming and Going," "Deixis I," and
"Deixis II."

~*Green, Bert, Alice Wolf, Carol Chomsky, and Kenneth Laughery, BASEBALL: An Automatic Question Answerer, in
Feigenbaum and Feldman (1963), pp. 207-216.
.in5
     BASEBALL was one of the first
question-answering programs.  It handled fairly complex
questions by limiting itself to a very specific and
highly structured set of knowledge (results of baseball games).

~*Green, Cordell, "Application of Theorem Proving to Problem
Solving," 
%e1IJCAI%*, 1969.
.in5
     A good summary of the advantages and difficulties of
programming AI in a predicate calculus formalism.

Hayakawa, S.I.,
%eLanguage in
Thought and Action%*.

Hewitt, Carl, Description and Theoretical Analysis
(Using Schemata) of PLANNER, MIT
AI-TR 258, April 1972.
.in5
     This is Hewitt's thesis which tells everything
you always wanted to know about PLANNER.
It's tough reading, but serves as a reference for
all of the different ideas in its development.  

Hewitt, Carl, A Universal Modular ACTOR Formalism for Artificial
Intelligence, %e3IJCAI%*, pp. 235-245.
.in5
     This paper is Hewitt at his most far-flung.  It has lots and
lots of interesting ideas which have implications for all AI and
language representations in particular.  They aren't laid out to
make understanding easy, though.  It's worth struggling through
this paper more than once, several months apart. You're likely to
find that something you've just struggled through inventing was
in there all along (or maybe you're just reading it into it,
who knows?).

~*Katz, J.J., and J.A. Fodor, The Structure
of a Semantic Theory, in Fodor and Katz, (eds.)
%eThe Structure
of Language,%*
Prentice Hall, 1964.
.in5
     Fodor and Katz popularized a type of componential analysis
and made connections between it and syntactic (transformational)
theory.  This paper is early, but no major new ideas were
developed in this direction later.

~*Lakoff, George, On Generative
Semantics, in
%eSemantics%*, Steinberg and Jakobovits,
(eds.), Cambridge U., 1971.
.in5
     Lakoff is the leading exponent of generative
semantics.  This paper describes many of the basic
disagreements between this theory and the
transformational orthodoxy.  Some of his recent
papers (mostly unpublished) are more interesting,
showing more and more concern with issues
of meaning similar to AI approaches.  One recent
one is "Hedges".

Lindsay, Peter, and Donald Norman,
%eHuman Information Processing,%*
Academic Press, 1972.
.in5
     An excellent textbook introducing a variety
of nice to know stuff from the annals of psychology
(particularly perception).  Chapters
10 (The Structure of Memory), 11 (Memory Processes),
and 12 (Language) are most relevant to this course.
They give a somewhat simplistic but clear approach to
how knowledge can be represented.

~*Lindsay, Robert, Inferential Memory as the Basis
of Machines which Understand Natural Language,
in Feigenbaum and Feldman (1963), pp. 217-235.
.in5
     SAD SAM was another early language program which used
family tree structures to provide a base for limited
understanding of English sentences.

~*Lyons, John, %eIntroduction
to Theoretical Linguistics,%*
Cambridge Univ., 1971 (especially chapters
9 and 10).

Lyons, John, %eStructural Semantics,%*
Oxford: Blackwell, 1963.
.in5
     Lyons tries to formalize some of the notions of meaning
like "opposite", "incompatible", etc.  It is not in a computational
style, but many of the observations are basic to understanding
how words convey meaning.  The theory is explained more briefly
in his text (see above).

MacKay, Donald, %eInformation,
Mechanism, and Meaning,%*
MIT, 196?.
.in5
     MacKay tries to apply the ideas of information theory
to problems of human language use.  It is fairly abstract,
but some of the connections are interesting.

Malinowski, Bronislaw, %eCoral
Gardens and
Their Magic,%*
London, 1935.
.in5
     An early study of the conceptual structures
used by different cultures, as indicated by
linguistic and social activities.

~*Martin, Bill, Rand Krumland, and Alex Sunguroff,
MAPL, A Language for Describing Models of
the World, and More MAPL: Specifications
and Basic Structures, Internal memos 6 and 8,
Automatic Programming Group, MIT, 1972.
.in5
     These papers describe the notation underlying
the semantic representations used for the world of
business model being developed for the
automatic programming project at MIT.  

~*McCarthy, John, and Pat Hayes,
Some Philosophical
Problems from the Standpoint of Artificial
Intelligence, in
%eMachine Intelligence ?,%*
Edinburgh, 19??.

.in5
     This paper relates some classical philosophical problems
to specific problems of representations for AI.  It stays
within a logician-oriented framework, but brings up a number
of issues which must be dealt with by any AI system.

McCawley, James, Where do Noun Phrases Come
From?, in Steinberg and Jakobovits (eds.), %eSemantics%*.
.in5
     McCawley, along with Lakoff and others, works in
generative semantics -- pretty heavy linguistic
reading.

McDermott, Drew, Assimilation of New Information
Into a Natural-Language Understanding System,
MIT Masters Thesis, 1972 (to appear
as an AI Memo soon).
.in5
     McDermott deals with the problem of filling in plausible
deductions needed to understand a simple scenario being described
in natural language.  The paper worries
about the representations needed to handle common-sense
facts like those about space, and the type
of deductive processes needed for understanding.

McDermott, Drew, and Gerald Sussman,
The Conniver Reference Manual, AI-Memo 259, 1972.
.in5
     A reference for the Conniver language -- should
be read in conjunction with the Sussman-McDermott
paper comparing Planner and Conniver.

~*Miller, George, English Verbs of Motion:
A Case Study in Semantics and Lexical Memory,
in Melton and Martin (eds.)
%eCoding
Processes in Human
Memory%*
Winston, 1972.
.in5
     Starting from a psycholinguistic viewpoint,
Miller tries to analyze the semantic structure of a set
of verbs indicating motion.  It isn't clear just
how the ideas would be used in something
as explicit as an understanding program, but he
points out many of the regularities found in 
meanings of words.

~*Minsky, Marvin, Matter, Mind, and Models, in
Minsky (ed.), %eSemantic
Information Processing,%*
MIT, 1968.
.in5
     A short paper clearly delineating the separation
between the world, a person's model of it, his use of the model,
his model of the model, etc.

Moore, Robert, DSCRIPT, A Computational Theory of Descriptions, %e3IJCAI%*,
pp. 223-229.  
.in5
     Moore tackles some of the classical problems
of reference from a computational framework, combining
some of the ideas of lambda calculus with more conventional
quantification.

Morris, Charles,
%eSigns, Language,
and Behavior.%*

Nevins, Arthur, A Human Oriented Logic for
Automatic Theorem Proving, MIT AI Memo 268,
October 1972.
.in5
     Nevins is trying to combine the advantages of
uniform notation and proof mechanisms with a more
natural deductive style.

Newell, Allen, and H.A. Simon,
%eHuman Problem Solving.%*
Prentice Hall,
1972.
.in5
     This mammoth work does not deal explicitly with
the semantic problems of this course, but the general
approach to representing human knowledge is very wide-ranging,
and can be applied at least indirectly.  Reading just
the Introduction and Concluding theoretical
chapters gives an overview, but to get
a clear idea of how their notions work, it
is necessary to dip into the other chapters.

Ogden, C.K., and I.A. Richards,
%eThe Meaning of Meaning,%*
Harcourt Brace, 1956.

Postman, Neil, and Charles Weingartner,
%eLinguistics, a
Revolution in
Teaching%*, Delta
(Dell) 1966.
.in5
     An excellent book explaining what linguistics
is and how it should be used in teaching.  Aimed at
high school teachers, but full of good simple stuff
and healthy scepticism.

Pratt, Vaughan, A Linguistics Oriented Programming Language,
%e3IJCAI%*, pp. 372-382.
.in5
     LINGOL is a language which makes it
easy to implement simple models of language processing.

Quillian, M. Ross, The
Teachable Language Comprehender,
%eCACM%* 12:8 (August 1969) pp. 459-475.
.in5
     This paper tries to combine the associative ideas
of Quillian's earlier work with various kinds of
logical capability.  As a result,
it does not have as clear a thread of ideas, but
shows well the kind of problems which
arise in trying to get different aspects of
understanding to interact.

~*Quillian, M. Ross, "Semantic Memory,"
in Minsky, 1968, pp. 216-270.
.in5
     One of the earliest language systems, Semantic
Memory deals only with the associative aspects
of semantic structure.  It has been widely read
and has served as a basis for psychological models.

Quine, Willard,
%eWord and Object,%*
MIT, 1960.
.in5
     A philosopher's formulation of a
behaviorist approach to meaning.  Brings up most
of the basic philosophical issues.

~*Raphael, Bertram, SIR, A Computer Program for Semantic Information
Retrieval, in Minsky, 1968, pp. 33-145.
.in5
     SIR was an early language-understanding program which handled
a variety of straightforward logical problems, but was limited
by the degree to which each additional type of knowledge required
reprogramming.  Raphael spends a good part of the thesis discussing
a proposed better system in which logical relations are handled
more uniformly (as they were in later programs).

Robinson, J.A.,
A Machine Oriented Logic Based on the Resolution
Principle, %eJACM%* 12:1 (January 1965) pp. 23-41.
.in5
    Resolution is a uniform proof procedure for
the predicate calculus, whose characteristics make it very
suitable for computer use.  It forms the basis
for many of the systems which use a logical representation.

~*Rumelhart, David, Peter
Lindsay, and Donald Norman,
A Process Model for Long Term Memory,
in Tulving and Donaldson (eds.)
%eOrganization
of Memory,%*
Academic Press, 1972.
.in5
     This paper presents a model which is halfway between
being psychology and AI.  It
tries to base the representation on
what seems reasonable for human memory, and also
tries to be specific about how operations
could be carried out on it.  It isn't easy
to evaluate how well it succeeds, but has
some worthwhile ideas and clarifications.
Some other papers in the volume are
worth looking at, particularly, Kintsch, Collins
and Quillian, and Tulving.

Salton, Gerald, %eThe
SMART Retrieval
System -- Experiments
in Automatic
Document Processing,%*
Prentice Hall, 1971.

~*Sandewall, Erik, Representing
Natural-language Information in Predicate Calculus,
Stanford AI-Memo 128, July 1970.
.in5
     Sandewall is one of the people who has worked hardest
at trying to get a wide
variety of English constructions represented in
predicate calculus in a natural way.  This paper
is a summary of some of the ideas.

~*Schank, Roger, Neil Goldman, Charles Rieger, and Chris Riesbeck,
MARGIE: Memory, Analysis, Response Generation, and Inference on English,
%e3IJCAI%*, 1973, pp. 255-261.
.in5
A description of a program based on Schank's conceptual analysis
ideas.  The program makes paraphrases of sentences given it in
English, integrating the various parts of language understanding.

~*Schank, Roger,
Finding the Conceptual Content
and Intention in an Utterance in Natural
Language Conversation, %e2IJCAI%* 1971,
pp. 444-454.
.in5
     Schank has written a number of papers on
conceptual dependency representations for
semantics.  This one swings through most of the
basic ideas, and gives references to a number
of the others.

~*Searle, J.R.,
%eSpeech Acts,%*
Cambridge, 1970.

Searle, J.R., (ed.) %eThe
Philosophy of Language,%*
Oxford, 1971.

Shannon, Claude, and Warren Weaver,
%eThe Mathematical
Theory of Communication,%*
Illinois U., 1963
.in5
     The classic work by the founders of information theory.

Steinberg, Danny, and Leon Jakobovits,
%eSemantics%*, Cambridge Univ. Press, 1971.
.in5
A good collection of rather difficult papers on semantics
from the traditional philosophical viewpoint, and from
a transformational grammar viewpoint.  A good place to
get an understanding of the current debate involving
generative semantics.

Skinner, B.F.,
%eVerbal Behavior,%*
Appleton-Century-Crofts,
1957.
.in5
     The attempt to describe language from a pure
behaviorist approach.  Chomsky's review of this book
is better and more widely read than the book itself.

Stone, Philip, D. Dunphy, M. Smith, and D. Ogilvie,
%eThe General Inquirer,%*
MIT, 1966.

Suppes, Pat, %eIntroduction
to Logic,%*
Van Nostrand, 1957. (especially first 4 chapters).

Sussman, G., T. Winograd, and E. Charniak,
Micro-Planner Reference Manual,
MIT AI Memo 203, July 1970.
.in5
     Micro-Planner is the only available 
implementation of the PLANNER language.
It is oversimplified, but has most
of the basic things (at least as they existed
when it was written).  The manual includes
a readable introduction.

~*Sussman, Gerald, and
Drew McDermott, From PLANNER to
Conniver: A Genetic Approach, %eProceedings
of FJCC%*, 1972 (also available as "Why Conniving
is better than Planning", MIT AI-Memo 255a)
.in5
     A discussion of the problem solving strategies
encouraged by PLANNER, and description
of Conniver, a deductive language which is Planner-like,
but with somewhat different orientation.

~*Weizenbaum, J., ELIZA, %eCACM%*
9:1 (January 1966), pp. 36-45.

Whorf, Benjamin,
%eLanguage Thought and
Reality%*, MIT, 1956.
.in5
     The classic work on the relationship between the
language and thought of a culture.  Hypothesizes
that our world view is greatly shaped by
the language we have available to describe it.

~*Wilks, Yorick, Understanding Without Proofs,
%e3IJCAI%*, 1973, pp. 270 - 277.

~*Winograd, Terry, %eUnderstanding
Natural Language,%*
Academic Press, 1972.

~*Woods, William, Procedural Semantics for a
Question-Answering Machine, %eProc. FJCC%*, 1968,
pp. 457-471.
.in5
     Woods's system is one of the first large-scale language
understanders which combines syntax, semantics, and
a limited kind of logical system to handle
a large subset of English.

~he:Preliminary topic list for CS 266 - Linguistics 266~
.skip 2
.begin center
Winter Quarter, 1973-4
Terry Winograd
.end

.fill
.indent 5,0
CS 266 (Linguistics 266) is intended as an advanced course for
students whose primary research interest is in the area of language.
There is no particular set of prerequisites, but each member
of the group should bring to it a sophisticated understanding of
language from some point of view as a result of work in a related
field such as linguistics, psychology, or artificial intelligence.

We will meet for one three-hour session each week to discuss and
critically analyze a set of readings.  These readings will be
extensive and will be drawn from a variety of sources in the
current literature.  The emphasis will be on work which could
be classified as "semantics", starting from a computational
point of view, but trying to integrate other approaches.
We will not expect to use the class time
to explicate or present material, assuming that each person
will carefully read and assimilate it as a prerequisite to the
discussion.

The topic list will be flexible and will depend on the interests
of the participants.  What follows is a set of preliminary
suggestions.

.once center
%2Possible topics for discussion%*

The Goals of Linguistic Theory
.int
What should a theory of linguistics include?  What does it even
mean to be a theory?  What is the current orthodoxy on these
questions?  What direction is it moving?  References include
Chomsky, Lakoff, Kuhn, Schank.

The competence/performance distinction
.int
Is this a meaningful distinction?  Where do AI programs fit in?
How does this interact with the syntax/semantics distinction?
References include Fodor and Garrett, Bever, Schank

New formalisms for syntax
.int
How do various syntactic theories relate -- case grammars,
systemic grammars, stratificational grammars, transformational
grammars, etc.?  What computational implications do they have?
References include Lamb, Halliday, Fillmore.

Generative semantics
.int
What is really going on in the current dispute?  Is there a
way of relating these issues to an AI framework for linguistic
theory?  References include Chomsky, Lakoff, McCawley, Ross.

Flexible formalisms
.int
How can we avoid the rigidity usually associated with
computational definitions?  What kind of "fuzzy" concepts
can be included in language-understanding programs?
How much do probabilistic notions help?  What else is
possible?  References include Lakoff, Abrahamson, Tversky.

Language Acquisition
.int
Are there any interesting computational models of language
learning?  Are there any good psychological theories?
What are the relevant criteria?  References include Brown,
Carol Chomsky, McNeill.

Formalisms from Logic
.int
Which formalisms from logic are useful for representing
natural language?  What is going on in modal logic,
and Montague semantics?  What are the computational implications
of these approaches?  References include Partee, Lewis, Montague,
Kripke, etc.


Semantic primitives for language 
.int
What are the underlying primitives on which our language
ability is built? How universal are they?
Is there a small coherent set?  How
can they be discovered?  How are they combined?  What
kinds of problems are involved in a reductionist
view?  References include Schank, Miller, etc.

Discourse structure
.int
What is the macro-structure of language?  How do the
context factors (both linguistic and situational)
determine the linguistic structure?  How can these be
formalized?  References include Halliday, Searle,
etc.

Epistemology - The representation problem
.int
How much of our theory is determined by a choice of
basic representation for knowledge?  What sorts of
formalisms are available for different areas?  How
do they relate?  What restrictions do they impose
on what can be expressed?  What are the computational
limitations?  References include McCarthy, Hewitt,
etc.


Psycholinguistic research
.int
What is the relation between AI-type linguistic modelling,
linguistic theory, and actual experimental results?  What
work is being done which has direct implications for 
computational models?  What experiments do current models
suggest?  What are the basic problems with a simple notion
of experimental verification of theories?  References
include Anderson and Bower, Collins and Quillian, Abrahamson and
Rumelhart, etc.  Maybe we could get people from the psychology
dept. into this discussion.